A Recurrent Network with Stochastic Weights

Authors

  • Jieyu Zhao
  • John Shawe-Taylor
Abstract

Stochastic neural networks for global optimization are usually built by introducing random fluctuations into the network. A natural method is to use stochastic weights rather than stochastic activation functions. We propose a new model in which each neuron has very simple functionality but all the weights are stochastic. It is shown that the stationary distribution of the network uniquely exists and that it is approximately a Boltzmann-Gibbs distribution when the size of the network is not too small. A new technique to implement simulated annealing is proposed. Simulation results on the graph bisection problem show that the power of the network is comparable with that of a Boltzmann machine.
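The abstract does not spell out the update rule, so the following is only an illustrative sketch, not the paper's actual model: binary neurons whose incoming weights are jittered by Gaussian noise each update, with the noise scale annealed like a temperature, applied to graph bisection (states +1/-1 assign nodes to the two halves). The function names, energy definition, and cooling schedule are all assumptions made for the example.

```python
import random

def bisection_energy(edges, state):
    """Cut size plus a quadratic penalty for an unbalanced partition."""
    cut = sum(1 for u, v in edges if state[u] != state[v])
    return cut + (sum(state) ** 2) / len(state)

def anneal_bisection(n, edges, steps=20000, t0=2.0, seed=1):
    """Hypothetical stochastic-weight annealing for graph bisection."""
    rng = random.Random(seed)
    state = [rng.choice((-1, 1)) for _ in range(n)]
    # Couplings: +1 between adjacent nodes (keeping neighbours on the
    # same side lowers the cut), minus a small uniform term on every
    # pair that pushes the two halves toward equal size.
    w = [[-2.0 / n] * n for _ in range(n)]
    for u, v in edges:
        w[u][v] += 1.0
        w[v][u] += 1.0
    best = list(state)
    for step in range(steps):
        t = t0 * (1.0 - step / steps) + 1e-3  # linear cooling schedule
        i = rng.randrange(n)
        # Stochastic weights: each incoming weight is perturbed by
        # Gaussian noise whose scale plays the role of temperature.
        field = sum((w[i][j] + rng.gauss(0.0, t)) * state[j]
                    for j in range(n) if j != i)
        state[i] = 1 if field >= 0 else -1
        if bisection_energy(edges, state) < bisection_energy(edges, best):
            best = list(state)
    return best
```

On a toy graph made of two 4-cliques joined by a single bridge edge, the annealed partition should separate the two cliques, giving a balanced cut of size one. As the temperature falls, the weight noise vanishes and the updates reduce to deterministic threshold dynamics, which is the usual simulated-annealing picture the abstract alludes to.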


Similar Articles

Neuro-Optimizer: A New Artificial Intelligent Optimization Tool and Its Application for Robot Optimal Controller Design

The main objective of this paper is to introduce a new intelligent optimization technique that uses a prediction-correction strategy supported by a recurrent neural network for finding a near optimal solution of a given objective function. Recently there have been attempts for using artificial neural networks (ANNs) in optimization problems and some types of ANNs such as Hopfield network and Boltzm...


Stochastic Neural Networks and Their Solutions to Optimisation Problems

Stochastic neural networks, which are a type of recurrent neural network, can be simply described as "neural networks built by introducing random variations into the network". This randomness comes from one of two sources: applying stochastic transfer functions to network neurons or determining the network weights stochastically. This randomness property makes this ty...


The High-Dimensional Geometry of Binary Neural Networks

Recent research has shown that one can train a neural network with binary weights and activations at train time by augmenting the weights with a high-precision continuous latent variable that accumulates small changes from stochastic gradient descent. However, there is a dearth of work to explain why one can effectively capture the features in data with binary weights and activations. Our main ...


Stochastic Recurrent Neural Network for Speech Recognition

This paper presents a new stochastic learning approach to construct a latent variable model for recurrent neural network (RNN) based speech recognition. A hybrid generative and discriminative stochastic network is implemented to build a deep classification model. In the implementation, we conduct stochastic modeling for hidden states of recurrent neural network based on the variational auto-enc...


Robust stability of stochastic fuzzy impulsive recurrent neural networks with time-varying delays

In this paper, global robust stability of stochastic impulsive recurrent neural networks with time-varying delays which are represented by the Takagi-Sugeno (T-S) fuzzy models is considered. A novel Linear Matrix Inequality (LMI)-based stability criterion is obtained by using Lyapunov functional theory to guarantee the asymptotic stability of uncertain fuzzy stochastic impulsive recurrent neural...



Publication year: 1994